💠 Compositional Learning Journal Club

Join us this week for an in-depth discussion on Compositional Learning in the context of cutting-edge text-to-image generative models. We will explore recent breakthroughs and challenges, focusing on how these models handle compositional tasks and where improvements can be made.

🌟 This Week's Presentation:

📌 Title:
A Cat Is A Cat (Not A Dog!): Unraveling Information Mix-ups in Text-to-Image Encoders through Causal Analysis and Embedding Optimization

🎙️ Presenter: Amir Kasaei

🧠 Abstract:
This work presents an in-depth analysis of the causal structure in the text encoder of text-to-image (T2I) diffusion models, highlighting its role in introducing information bias and loss. While prior research has mainly addressed these issues during the denoising stage, this study focuses on the underexplored contribution of text embeddings—particularly in multi-object generation scenarios. The authors investigate how text embeddings influence the final image output and why models often favor the first-mentioned object, leading to imbalanced representations. To mitigate this, they propose a training-free text embedding balance optimization method that improves information balance in Stable Diffusion by 125.42%. Additionally, a new automatic evaluation metric is introduced, offering a more accurate assessment of information loss with an 81% concordance rate with human evaluations. This metric better captures object presence and accuracy compared to existing measures like CLIP-based text-image similarity scores.
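💡 As a warm-up for the discussion, here is a minimal sketch of the general idea behind training-free text-embedding balancing. This is not the paper's actual method: the CLIP checkpoint, the token positions, and the variance-of-norms objective below are illustrative assumptions, using the Hugging Face transformers API.

```python
import torch
from transformers import CLIPTokenizer, CLIPTextModel

# CLIP text encoder of the kind used by Stable Diffusion (checkpoint is an assumption).
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
text_encoder = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

prompt = "a cat and a dog"
inputs = tokenizer(prompt, padding="max_length",
                   max_length=tokenizer.model_max_length, return_tensors="pt")
with torch.no_grad():
    base_embeds = text_encoder(**inputs).last_hidden_state  # shape (1, 77, 768)

# Illustrative token positions for "cat" and "dog" in this prompt (assumption).
obj_positions = [2, 5]

# Training-free: optimize a small additive correction to the embeddings so the
# two object tokens carry comparable signal (equal norms here, a toy stand-in
# for the paper's balance objective).
delta = torch.zeros_like(base_embeds, requires_grad=True)
optimizer = torch.optim.Adam([delta], lr=1e-2)
for _ in range(100):
    embeds = base_embeds + delta
    norms = torch.stack([embeds[0, p].norm() for p in obj_positions])
    loss = norms.var()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

balanced_embeds = (base_embeds + delta).detach()
# balanced_embeds could then be passed to a Stable Diffusion pipeline via its
# prompt_embeds argument instead of the raw prompt string.
```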

📄 Paper:
A Cat Is A Cat (Not A Dog!): Unraveling Information Mix-ups in Text-to-Image Encoders through Causal Analysis and Embedding Optimization

Session Details:
- 📅 Date: Tuesday
- 🕒 Time: 5:00 - 6:00 PM
- 🌐 Location: Online at vc.sharif.edu/ch/rohban

We look forward to your participation! ✌️




RIML Lab




